Universum-Inspired Supervised Contrastive Learning

Authors

Abstract

Mixup is an efficient data augmentation method which generates additional samples through convex combinations of original data points and labels. Although theoretically dependent on data properties, Mixup is reported to perform well as a regularizer and calibrator, contributing reliable robustness and generalization to neural network training. In this paper, inspired by Universum Learning, which uses out-of-class samples to assist the target tasks, we investigate Mixup from a largely under-explored perspective - the potential to generate in-domain samples that belong to none of the target classes, that is, universum. We find that in the framework of supervised contrastive learning, universum-style Mixup produces surprisingly high-quality hard negatives, greatly relieving the need for a large batch size in contrastive learning. With these findings, we propose Universum-inspired Contrastive learning (UniCon), which incorporates the Mixup strategy to generate universum data as g-negatives and pushes them apart from anchor samples of the target classes. Our approach not only improves Mixup with hard labels, but also innovates a novel measure to generate universum data. With a linear classifier on the representations learned with Resnet-50, our method achieves 81.68% top-1 accuracy on CIFAR-100, surpassing the state of the art by a significant margin of 5% with a much smaller batch size.
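The abstract describes Mixup as forming convex combinations of both inputs and labels. A minimal sketch of that interpolation step, assuming one-hot labels and a Beta-distributed mixing coefficient as in the original Mixup formulation (function name, shapes, and the example data are illustrative, not taken from the paper):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=1.0, rng=None):
    # Draw the mixing coefficient lambda ~ Beta(alpha, alpha).
    rng = rng if rng is not None else np.random.default_rng(0)
    lam = float(rng.beta(alpha, alpha))
    # Convex combination of the inputs and of the one-hot labels.
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y, lam

# Example: mixing two 4-pixel "images" with one-hot labels for classes 0 and 2.
x1, y1 = np.ones(4), np.array([1.0, 0.0, 0.0])
x2, y2 = np.zeros(4), np.array([0.0, 0.0, 1.0])
x, y, lam = mixup(x1, y1, x2, y2)
```

When x1 and x2 come from different classes, the mixed sample lies between the two class manifolds, which is the kind of in-domain, none-of-the-classes point the paper calls universum data.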


Similar Resources

Semi-Supervised Classification with Universum

The Universum data, defined as a collection of "non-examples" that do not belong to any class of interest, have been shown to encode some prior knowledge by representing meaningful concepts in the same domain as the problem at hand. In this paper, we address a novel semi-supervised classification problem, called semi-supervised Universum, that can simultaneously utilize the labeled data, unlabel...


Selecting Informative Universum Sample for Semi-Supervised Learning

The Universum sample, which is defined as the sample that doesn’t belong to any of the classes the learning task concerns, has been proved to be helpful in both supervised and semi-supervised settings. The former works treat the Universum samples equally. Our research found that not all the Universum samples are helpful, and we propose a method to pick the informative ones, i.e., inbetween Univ...


Time-Contrastive Networks: Self-Supervised Learning from Video

We propose a self-supervised approach for learning representations and robotic behaviors entirely from unlabeled videos recorded from multiple viewpoints, and study how this representation can be used in two robotic imitation settings: imitating object interactions from videos of humans, and imitating human poses. Imitation of human behavior requires a viewpoint-invariant representation that ca...


Weakly-Supervised Learning with Cost-Augmented Contrastive Estimation

We generalize contrastive estimation in two ways that permit adding more knowledge to unsupervised learning. The first allows the modeler to specify not only the set of corrupted inputs for each observation, but also how bad each one is. The second allows specifying structural preferences on the latent variable used to explain the observations. They require setting additional hyperparameters, w...


Supervised Learning with Quantum-Inspired Tensor Networks

Tensor networks are efficient representations of high-dimensional tensors which have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images. For the MNIST data set we obtain less than 1% test s...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-25198-6_34